
    Expressive touch: Control of robot emotional expression by touch

    In this paper, we present work on the control of robot emotional expression using touch sensing. A tactile Bayesian framework is proposed for the recognition of different types of touch gestures. We include a sequential analysis method that, by accumulating evidence from tactile interaction, achieves accurate touch recognition. Input data to our method are obtained from touch sensing, an important modality for social robotics. Emotions in the robot platform are represented by facial expressions, which are handled by a purpose-built control architecture. We validate our method with tactile interaction experiments in simulated and real robot environments. The results demonstrate that our proposed method is suitable and accurate for controlling robot emotions through interaction with humans using touch sensing. Furthermore, they demonstrate the potential of touch as a non-verbal communication channel for the development of social robots capable of interacting with humans.
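
    The abstract gives no implementation details, but the pairing of a Bayesian framework with sequential analysis suggests a recursive belief update with a decision threshold. Below is a minimal sketch under that assumption; the `likelihood` function, gesture count, and threshold value are hypothetical, not the paper's.

    ```python
    import numpy as np

    def sequential_touch_recognition(samples, likelihood, n_gestures, threshold=0.99):
        """Accumulate tactile evidence until one gesture's posterior belief
        crosses the decision threshold (sequential analysis)."""
        belief = np.full(n_gestures, 1.0 / n_gestures)   # uniform prior
        for t, z in enumerate(samples, start=1):
            belief = belief * likelihood(z)   # p(z | gesture), shape (n_gestures,)
            belief /= belief.sum()            # normalise to a posterior
            if belief.max() >= threshold:     # enough evidence: decide
                return int(belief.argmax()), t, belief
        return int(belief.argmax()), len(samples), belief
    ```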

    Adaptive perception: learning from sensory predictions to extract object shape with a biomimetic fingertip

    In this work, we present an adaptive perception method that improves the accuracy and speed of a tactile exploration task. This work extends our previous studies on sensorimotor control strategies for active tactile perception in robotics. First, we present the active Bayesian perception method, which actively repositions a robot to accumulate evidence from better locations and reduce uncertainty. Second, we describe the adaptive perception method that, based on a forward model and a predicted information gain approach, allows the robot to analyse 'what would have happened' if a different decision 'had been made' at a previous decision time. This approach adapts the active Bayesian perception process to improve the accuracy and reaction time of an exploration task. Our methods are validated with a contour-following exploratory procedure using a touch sensor. The results show that the adaptive perception method allows the robot to make sensory predictions and adapt autonomously, improving the performance of the exploration task.
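
    As one reading of the predicted information gain idea, the sketch below scores a candidate repositioning action by the expected reduction in posterior entropy under a forward model. The array shapes and the discretisation into candidate actions are assumptions for illustration, not the paper's formulation.

    ```python
    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return float(-np.sum(p * np.log(p)))

    def predicted_information_gain(belief, forward_model):
        """Expected entropy reduction for one candidate repositioning action.
        forward_model[z, s] = p(observation z | true state s) at that location."""
        h_prior = entropy(belief)
        gain = 0.0
        for p_obs in forward_model:        # loop over possible observations z
            p_z = float(p_obs @ belief)    # predictive probability of z
            if p_z > 0:
                posterior = p_obs * belief / p_z   # hypothetical Bayes update
                gain += p_z * (h_prior - entropy(posterior))
        return gain

    def best_action(belief, forward_models):
        """Choose the action whose predicted information gain is largest."""
        return int(np.argmax([predicted_information_gain(belief, fm)
                              for fm in forward_models]))
    ```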

    Active haptic shape recognition by intrinsic motivation with a robot hand

    In this paper, we present an intrinsic motivation approach applied to haptics in robotics for tactile object exploration and recognition. Here, touch is used as the sensation process for contact detection, whilst proprioceptive information is used for the perception process. First, a probabilistic method is employed to reduce the uncertainty present in tactile measurements. Second, the object exploration process is actively controlled by intelligently moving the robot hand towards interesting locations. The active behaviour of the robotic hand is achieved by an intrinsic motivation approach, which improved object recognition accuracy over the results obtained with a fixed sequence of exploration movements. The proposed method was validated in a simulated environment with a Monte Carlo method, whilst for the real environment a three-fingered robotic hand and various object shapes were employed. The results demonstrate that our method is robust and suitable for haptic perception in autonomous robotics.
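
    The abstract leaves the intrinsic motivation signal unspecified; one common choice is to score candidate contact locations by posterior uncertainty plus a novelty bonus. The sketch below uses that interpretation, and the entropy-plus-visit-count reward is an assumption, not the paper's mechanism.

    ```python
    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return float(-np.sum(p * np.log(p)))

    def next_contact_location(location_beliefs, visit_counts, beta=0.5):
        """Score each candidate contact location by posterior uncertainty plus
        a novelty bonus for rarely visited locations, and return the best."""
        scores = [entropy(b) + beta / (1 + n)
                  for b, n in zip(location_beliefs, visit_counts)]
        return int(np.argmax(scores))
    ```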

    Telepresence: Immersion with the iCub Humanoid Robot and the Oculus Rift

    In this paper, we present an architecture for the study of telepresence and human-robot interaction. The telepresence system couples the visual and gaze control systems of the iCub humanoid robot with the Oculus Rift virtual reality headset. The human observes a remote location through the visual feedback displayed in the Oculus Rift, and explores the remote environment by controlling the eyes and head of the iCub humanoid robot with orientation information from their own head movements. Our system was tested from various remote locations, both on a local network and over the internet, producing smooth control of the robot. This provides a robust architecture for immersing humans in a robotic system for remote observation and exploration of the environment.
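
    A minimal sketch of the head-coupling loop described above, assuming hypothetical I/O stubs in place of the real Oculus tracker and iCub motor interfaces (which in the authors' system would run over middleware such as YARP); the joint limits are illustrative values, not the iCub's actual ranges.

    ```python
    import numpy as np

    # Hypothetical I/O stubs standing in for the headset tracker and the
    # robot's neck/gaze motor interface.
    def read_headset_orientation():
        """Return (yaw, pitch, roll) of the headset in degrees."""
        raise NotImplementedError

    def send_neck_command(yaw, pitch, roll):
        """Send an orientation command to the robot's neck."""
        raise NotImplementedError

    NECK_LIMITS = {"yaw": (-45, 45), "pitch": (-30, 22), "roll": (-20, 20)}  # illustrative

    def teleoperate_head():
        """Stream the human's head orientation to the robot's neck, clamped
        to joint limits so extreme headset poses remain safe."""
        while True:
            yaw, pitch, roll = read_headset_orientation()
            send_neck_command(float(np.clip(yaw, *NECK_LIMITS["yaw"])),
                              float(np.clip(pitch, *NECK_LIMITS["pitch"])),
                              float(np.clip(roll, *NECK_LIMITS["roll"])))
    ```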

    Feeling the Shape: Active Exploration Behaviors for Object Recognition With a Robotic Hand

    Autonomous exploration in robotics is a crucial feature for achieving robust and safe systems capable of interacting with and recognizing their surrounding environment. In this paper, we present a method for object recognition using a three-fingered robotic hand that actively explores interesting object locations to reduce uncertainty. We present a novel probabilistic perception approach with a Bayesian formulation to iteratively accumulate evidence from robot touch. Exploration of better locations for perception is performed by familiarity and novelty exploration behaviors, which intelligently control the robot hand to move toward locations with low and high levels of interestingness, respectively. These are active behaviors that, similar to the exploratory procedures observed in humans, allow robots to autonomously explore locations they believe contain interesting information for recognition. The active behaviors are validated with object recognition experiments in both offline and real-time modes. Furthermore, the effects of inhibiting the active behaviors are analyzed with a passive exploration strategy. The results from the experiments demonstrate not only the accuracy of our proposed methods, but also their benefits for active robot control to intelligently explore and interact with the environment.
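
    One simple way to realise the familiarity and novelty behaviors is to rank candidate locations by the entropy of their current beliefs and move toward the least or most uncertain one, respectively. The sketch below encodes that reading; it is an assumption about the paper's mechanism, not its actual controller.

    ```python
    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return float(-np.sum(p * np.log(p)))

    def select_location(location_beliefs, behavior="novelty"):
        """Familiarity moves toward the location whose belief is most certain
        (lowest entropy); novelty moves toward the most uncertain one."""
        h = np.array([entropy(b) for b in location_beliefs])
        return int(h.argmin() if behavior == "familiarity" else h.argmax())
    ```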

    A SOLID case for active Bayesian perception in robot touch

    In a series of papers, we have formalized a Bayesian perception approach for robotics based on recent progress in understanding animal perception. The main principle is to accumulate evidence for multiple perceptual alternatives until a preset belief threshold is reached, formally related to sequential analysis methods for optimal decision making. Here, we extend this approach to active perception by moving the sensor with a control strategy that depends on the posterior beliefs during decision making. This method can be used to solve problems involving Simultaneous Object Localization and IDentification (SOLID), or 'where and what'. Considering an example in robot touch, we find that active perception gives an efficient, accurate solution to the SOLID problem for uncertain object locations; in contrast, passive Bayesian perception, which lacks sensorimotor feedback, performed poorly. Thus, active perception can enable robust sensing in unstructured environments. © 2013 Springer-Verlag Berlin Heidelberg
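
    The 'where and what' formulation can be sketched as a joint posterior over location and identity that each tactile observation updates, with the marginals answering the two questions separately. The shapes and the fixation-style control step below are illustrative assumptions.

    ```python
    import numpy as np

    def solid_update(belief, likelihood_z):
        """One Bayes update of a joint belief over (location, identity).
        belief and likelihood_z are arrays of shape (n_locations, n_identities)."""
        belief = belief * likelihood_z
        belief = belief / belief.sum()
        p_where = belief.sum(axis=1)   # where is the object?
        p_what = belief.sum(axis=0)    # what is the object?
        return belief, p_where, p_what

    def fixation_step(p_where, current_location):
        """Active control sketch: step the sensor toward the most likely location."""
        return int(np.sign(int(np.argmax(p_where)) - current_location))
    ```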

    Multisensory wearable interface for immersion and telepresence in robotics

    The idea of being present in a remote location has inspired researchers to develop robotic devices that let humans experience the feeling of telepresence. These devices require multiple channels of sensory feedback to provide a realistic telepresence experience. In this work, we develop a wearable interface for immersion and telepresence that provides humans with the capability both to receive multisensory feedback from vision, touch and audio and to remotely control a robot platform. Multimodal feedback from the remote environment is based on the integration of sensor technologies coupled to the sensory system of the robot platform. Remote control of the robot is achieved by a modularised architecture, which allows the user to visually explore the remote environment. We validated our work with multiple experiments in which participants, located at different venues, successfully controlled the robot platform while visually exploring, touching and listening to a remote environment. In our experiments we used two different robotic platforms: the iCub humanoid robot and the Pioneer LX mobile robot. These experiments show that our wearable interface is comfortable, easy to use and adaptable to different robotic platforms. Furthermore, we observed that our approach allows humans to experience a vivid feeling of being present in a remote environment.
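
    A structural sketch of how such a modularised architecture might fan sensory streams out to the wearable interface and relay head-pose commands back. All module classes here are hypothetical stubs standing in for real camera, audio, tactile, headset and motor transports; they are not the authors' API.

    ```python
    # Hypothetical module stubs for the modularised architecture.
    class VisionModule:
        def read(self): ...

    class AudioModule:
        def read(self): ...

    class TouchModule:
        def read(self): ...

    class WearableInterface:
        def display(self, frame): ...
        def play(self, sound): ...
        def vibrate(self, contacts): ...
        def head_pose(self): ...

    class RobotPlatform:
        def set_gaze(self, pose): ...

    def telepresence_loop(vision, audio, touch, wearable, robot):
        """Fan the robot's sensory streams out to the wearable interface and
        relay the user's head pose back as a control command."""
        while True:
            wearable.display(vision.read())       # visual feedback
            wearable.play(audio.read())           # auditory feedback
            wearable.vibrate(touch.read())        # tactile feedback
            robot.set_gaze(wearable.head_pose())  # remote visual exploration
    ```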

    Active touch for robust perception under position uncertainty

    In this paper, we propose that active perception will help attain autonomous robotics in unstructured environments by providing robust perception. We test this claim with a biomimetic fingertip that senses surface texture under a range of contact depths. We compare the performance of passive Bayesian perception with a novel approach for active perception that includes a sensorimotor loop for controlling sensor position. Passive perception at a single depth gave poor results, with just 0.2 mm of position uncertainty impairing performance, and extending passive perception over a range of depths gave non-robust performance. Only active perception gave robust, accurate performance, with the sensorimotor feedback compensating for the position uncertainty. We expect these results to extend to other stimuli, so that active perception will offer a general approach to robust perception in unstructured environments.
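
    A minimal sketch of the active strategy as described: each tap contributes texture evidence, while the perceived contact depth drives a corrective move back toward a reference depth. The `sense` and `reposition` callables and all parameter values are assumptions, not the paper's controller.

    ```python
    import numpy as np

    def active_texture_perception(belief, sense, reposition, depth_ref=1.0,
                                  threshold=0.99, max_taps=50):
        """Classify texture while actively regulating contact depth.
        sense() -> (likelihood over textures, perceived depth in mm);
        reposition(delta) moves the sensor by delta mm (both hypothetical)."""
        belief = np.asarray(belief, dtype=float)
        for _ in range(max_taps):
            likelihood, depth = sense()
            belief = belief * likelihood
            belief /= belief.sum()
            reposition(depth_ref - depth)   # sensorimotor feedback loop
            if belief.max() >= threshold:
                break
        return int(belief.argmax()), belief
    ```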

    Bayesian perception of touch for control of robot emotion

    In this paper, we present a Bayesian approach for the perception of touch and control of robot emotion. Touch is an important sensing modality for the development of social robots, and it is used in this work as the stimulus in a human-robot interaction. A Bayesian framework is proposed for the perception of various types of touch. This method, together with a sequential analysis approach, allows the robot to accumulate evidence from interaction with humans to achieve accurate touch perception for adaptable control of robot emotions. Facial expressions are used to represent the emotions of the iCub humanoid. Emotions in the robotic platform, based on facial expressions, are handled by a control architecture that works with the output from the touch perception process. We validate the accuracy of our system with simulated and real robot touch experiments. The results show that our method is suitable and accurate for the perception of touch to control robot emotions, which is essential for the development of sociable robots.
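
    The abstract does not list the touch classes or expressions, so the mapping below is purely illustrative: a confident touch posterior from the Bayesian perception process selects a facial expression, and anything below the confidence level leaves the face neutral.

    ```python
    import numpy as np

    # Hypothetical gesture labels and expression mapping, for illustration only.
    GESTURES = ["stroke", "pat", "slap"]
    TOUCH_TO_EXPRESSION = {"stroke": "happy", "pat": "content", "slap": "sad"}

    def emotion_from_touch(belief, confidence=0.95, default="neutral"):
        """Change the facial expression only once the touch posterior is
        confident enough."""
        if belief.max() >= confidence:
            return TOUCH_TO_EXPRESSION[GESTURES[int(np.argmax(belief))]]
        return default
    ```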

    Simultaneous Bayesian recognition of locomotion and gait phases with wearable sensors

    Recognition of movement is a crucial process for assisting humans in activities of daily living, such as walking. In this work, a high-level method for the simultaneous recognition of locomotion modes and gait phases using wearable sensors is presented. A Bayesian formulation is employed to iteratively accumulate evidence, reducing uncertainty and improving recognition accuracy. This process uses a sequential analysis method to make decisions autonomously whenever the recognition system perceives that enough evidence has been accumulated. We use data from three wearable sensors attached to the thigh, shank, and foot of healthy humans. Level-ground walking, ramp ascent and ramp descent activities are used for data collection and recognition. In addition, an approach for segmentation of the gait cycle for recognition of stance and swing phases is presented. Validation results show that the simultaneous Bayesian recognition method is capable of recognizing walking activities and gait phases with mean accuracies of 99.87% and 99.20%, requiring a mean of 25 and 13 sensor samples to make a decision for locomotion mode and gait phase, respectively. The recognition process is analyzed using different levels of confidence to show that our method is highly accurate, fast, and adaptable to specific requirements of accuracy and speed. Overall, the simultaneous Bayesian recognition method demonstrates its benefits for recognition with wearable sensors, which can be employed to provide reliable assistance to humans in their walking activities.
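
    One way to read the 'simultaneous' recognition is as two sequential Bayesian recognizers running in parallel on the same sensor stream, each stopping independently once its belief crosses the confidence level; raising that level trades decision speed for accuracy. The sketch below follows that reading, with the likelihood models and class counts as assumptions.

    ```python
    import numpy as np

    def simultaneous_recognition(samples, loco_lik, phase_lik,
                                 n_modes=3, n_phases=2, level=0.99):
        """Parallel sequential recognition of locomotion mode (level walking,
        ramp ascent, ramp descent) and gait phase (stance, swing). Each
        recognizer decides once its posterior belief reaches `level`."""
        modes = np.full(n_modes, 1.0 / n_modes)
        phases = np.full(n_phases, 1.0 / n_phases)
        mode_decision = phase_decision = None
        for t, z in enumerate(samples, start=1):
            if mode_decision is None:
                modes = modes * loco_lik(z)
                modes /= modes.sum()
                if modes.max() >= level:
                    mode_decision = (int(modes.argmax()), t)
            if phase_decision is None:
                phases = phases * phase_lik(z)
                phases /= phases.sum()
                if phases.max() >= level:
                    phase_decision = (int(phases.argmax()), t)
            if mode_decision and phase_decision:
                break
        return mode_decision, phase_decision
    ```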